
    Testing the utility of a data-driven approach for assessing BMI from face images

    Several lines of evidence suggest that facial cues of adiposity may be important for human social interaction. However, tests for quantifiable cues of body mass index (BMI) in the face have examined only a small number of facial proportions, and these proportions were found to have relatively low predictive power. Here we employed a data-driven approach in which statistical models were built using principal components (PCs) derived from objectively defined shape and color characteristics in face images. The predictive power of these models was then compared with models based on previously studied facial proportions (perimeter-to-area ratio, width-to-height ratio, and cheek-to-jaw width). Models based on 2D shape-only PCs, color-only PCs, and 2D shape and color PCs combined each performed significantly and substantially better than models based on one or more of the previously studied facial proportions. A non-linear PC model considering both 2D shape and color PCs was the best predictor of BMI. These results highlight the utility of a “bottom-up”, data-driven approach for assessing BMI from face images.
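    A minimal sketch of this kind of pipeline is shown below, assuming the face images have already been reduced to numeric shape and colour feature vectors; the variable names, dimensions, and simulated data are placeholders rather than the authors' actual implementation.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)
        n = 200
        shape_features = rng.normal(size=(n, 50))    # placeholder for 2D shape descriptors
        colour_features = rng.normal(size=(n, 50))   # placeholder for colour descriptors
        bmi = rng.normal(loc=24, scale=4, size=n)    # placeholder for measured BMI

        # Combined 2D shape + colour model: PCA for dimensionality reduction,
        # then a linear model predicting BMI from the retained PCs.
        X = np.hstack([shape_features, colour_features])
        pc_model = make_pipeline(PCA(n_components=20), LinearRegression())
        r2_scores = cross_val_score(pc_model, X, bmi, cv=10, scoring="r2")
        print("mean cross-validated r^2:", r2_scores.mean())

        # A baseline in the spirit of the earlier literature would replace X with
        # a 1-3 column matrix of facial proportions (e.g. width-to-height ratio).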

    Interpretation of appearance: the effect of facial features on first impressions and personality.

    Appearance is known to influence social interactions, which in turn could potentially influence personality development. In this study we focus on discovering the relationship between self-reported personality traits, first impressions, and facial characteristics. The results reveal that several personality traits can be read above chance from a face, and that facial features influence first impressions. Despite the former finding, our prediction model fails to reliably infer personality traits from either facial features or first impressions. First impressions, however, could be inferred more reliably from facial features. We have generated artificial, extreme faces visualising the characteristics that affect first impressions for several traits. In conclusion, we find a relationship between first impressions, some personality traits, and facial features, and confirm that people on average assess a given face in a highly similar manner.

    Overview of analyses.

    A schematic of the procedures for calculating facial metrics (A) and principal components (B), and the subsequent prediction of BMI measures.

    Comparison of model performance.

    The distribution of r² values for the 30 repeats of each model. In these violin plots [30,31], the white dot shows the median value, the thick black bars span the first to the third quartiles, the whiskers span 1.5 times the interquartile range, and the red bars show the distribution of values.
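    The violin plot itself can be sketched as follows; the r² values are simulated and the model names are illustrative, since only the structure of the plot is being reproduced.

        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(1)
        models = ["proportions", "shape PCs", "colour PCs", "shape + colour PCs"]
        r2_by_model = [rng.normal(loc=m, scale=0.03, size=30)      # 30 repeats per model
                       for m in (0.15, 0.35, 0.40, 0.50)]          # placeholder r^2 levels

        fig, ax = plt.subplots()
        ax.violinplot(r2_by_model, showmedians=True, showextrema=True)
        ax.set_xticks(range(1, len(models) + 1))
        ax.set_xticklabels(models)
        ax.set_ylabel("cross-validated r^2")
        plt.show()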

    Prediction of <i>Ratings</i> from facial features.

    The plot shows the average correlation coefficient and standard deviation between observed and predicted scores for each Rating and each gender. A linear regression model was built in a 20-fold cross-validation with a varying number of the most correlated facial components as predictors, chosen based on the training set. Standard deviations were obtained by running the calculations thirty times with different folds for each run. The Ratings are ordered in the plot by prediction performance for the male faces. The size of the points indicates Cronbach's α for that trait; larger α-values correlate positively with prediction performance. Abbreviations for the Ratings are: Trustw. = Trustworthy, Adv. = Adventurous, Temp. = Temperamental, Healthy = Physically Healthy, Ext. = Extraverted, Dom. = Dominating, Att. = Attractive, Masc. = Masculine, Em. Stab. = Emotionally Stable, Resp. = Responsible and Int. = Intelligent.
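    A sketch of this cross-validation scheme is given below, assuming the facial components are available as a numeric matrix; the data, the number of selected components, and all names are placeholders apart from the 20 folds taken from the caption.

        import numpy as np
        from scipy.stats import pearsonr
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import KFold

        def cv_correlation(X, y, k_features=10, n_folds=20, seed=0):
            """Correlation between observed and predicted Rating scores, with the
            k most correlated components chosen on the training folds only."""
            preds = np.empty_like(y)
            for train, test in KFold(n_folds, shuffle=True, random_state=seed).split(X):
                corrs = [abs(pearsonr(X[train, j], y[train])[0]) for j in range(X.shape[1])]
                top = np.argsort(corrs)[-k_features:]
                model = LinearRegression().fit(X[train][:, top], y[train])
                preds[test] = model.predict(X[test][:, top])
            return pearsonr(y, preds)[0]

        rng = np.random.default_rng(2)
        X = rng.normal(size=(200, 60))                      # placeholder facial components
        y = X[:, :5].sum(axis=1) + rng.normal(size=200)     # placeholder Rating scores
        # Repeating with different seeds (different folds) gives the spread
        # reported as standard deviations in the figure.
        print("observed vs. predicted r:", cv_correlation(X, y))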

    Network graph of all significant correlations between <i>Ratings</i>.

    The network depicts the relationships between the individual Ratings as the correlation coefficient, r, between scores. Dashed lines depict negative correlations and solid lines positive correlations; the thickness of a line indicates the strength of the relationship, with r as the edge label. Relationships significant for both genders are black, for men blue, and for women magenta. Three clusters can be seen in the network: Trustworthy, Responsible, Friendly and Intelligent in the first; Extraverted, Adventurous, Emotionally Stable, Attractive and Physically Healthy in the second; and Temperamental, Dominating and Masculine in the third. We named the clusters trustworthiness-friendliness, attractiveness-health-extraversion and dominance-masculinity.
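    The network can be reproduced along the lines below; the edge list is a small illustrative subset with placeholder r values, not the full set of significant correlations.

        import matplotlib.pyplot as plt
        import networkx as nx

        edges = [                                   # (Rating A, Rating B, r) placeholders
            ("Trustworthy", "Responsible", 0.6),
            ("Trustworthy", "Friendly", 0.5),
            ("Attractive", "Physically Healthy", 0.4),
            ("Dominating", "Masculine", 0.5),
            ("Trustworthy", "Dominating", -0.3),
        ]

        G = nx.Graph()
        for a, b, r in edges:
            G.add_edge(a, b, weight=r)

        pos = nx.spring_layout(G, seed=3)
        nx.draw_networkx_nodes(G, pos, node_color="lightgrey")
        nx.draw_networkx_labels(G, pos, font_size=8)
        positive = [e for e in G.edges if G.edges[e]["weight"] > 0]
        negative = [e for e in G.edges if G.edges[e]["weight"] < 0]
        nx.draw_networkx_edges(G, pos, edgelist=positive, style="solid")   # positive r
        nx.draw_networkx_edges(G, pos, edgelist=negative, style="dashed")  # negative r
        nx.draw_networkx_edge_labels(G, pos, edge_labels=nx.get_edge_attributes(G, "weight"))
        plt.axis("off")
        plt.show()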

    Example of two facial features, PC2 and PC13, and their interaction.

    The faces visualise how two principal components (PCs) extracted by an Appearance Model interact with each other. The coordinate system shows the change in a face when a principal component is moved two standard deviations in either the positive or the negative direction. The face in the middle shows the mean for all factors. For example, the face in the upper right shows PC2 and PC13 at +2 standard deviations. PC13 explains the shape of the mouth and PC2 the face width.
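    One way to synthesise such faces is sketched below, assuming the appearance model has been fitted as a PCA over vectorised shape-and-texture data; the fitted model and its dimensions are placeholders.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(4)
        appearance = rng.normal(size=(300, 1000))        # placeholder appearance vectors
        pca = PCA(n_components=20).fit(appearance)

        def face_at(pc_a, sd_a, pc_b, sd_b):
            """Mean face displaced sd_a / sd_b standard deviations along two PCs."""
            stds = np.sqrt(pca.explained_variance_)
            return (pca.mean_
                    + sd_a * stds[pc_a] * pca.components_[pc_a]
                    + sd_b * stds[pc_b] * pca.components_[pc_b])

        # e.g. the upper-right face of the figure: PC2 and PC13 both at +2 SD
        upper_right = face_at(pc_a=1, sd_a=+2, pc_b=12, sd_b=+2)   # 0-indexed PCs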

    Validation of extreme faces.

    In the validation we presented four faces to 116 persons and asked them to choose which one they found to represent a given trait the most. The left plot shows results for the male extremes and the right plot results for the female extremes. The length of each section in each bar indicates the percentage of times the given face was chosen. The dotted line indicates the percentage expected if the extreme face were selected at random. In all cases except one the extreme face was chosen more often than chance. For the male faces the extremes were chosen significantly more often than chance; for the female faces this was only found for the Friendly and Adventurous extremes. The colours are from [33].
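    The comparison against chance can be sketched with a one-sided binomial test: with four faces, random choice picks the extreme face 25% of the time. The count of raters choosing the extreme face below is a placeholder; scipy.stats.binomtest is used here, and the caption does not specify which test the authors applied.

        from scipy.stats import binomtest

        n_raters = 116
        n_chose_extreme = 52   # placeholder count of raters choosing the extreme face
        result = binomtest(n_chose_extreme, n_raters, p=0.25, alternative="greater")
        print("p-value vs. 25% chance level:", result.pvalue)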

    Extreme faces for the <i>Ratings</i>.

    For each face pair the left extreme face is predicted as being judged very low for a given trait and the right face as very high. Each face is based on the β-coefficients from the best linear regression model for that given Rating and gender. We generated the faces by multiplying each β-coefficient by either +4 or -4 standard deviations of the matching facial component. A: Male extremes for Adventurous. B: Male extremes for Friendly. C: Male extremes for Dominating. D: Female extremes for Adventurous. E: Female extremes for Trustworthy. F: Female extremes for Dominating.
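    A sketch of this extreme-face construction, following the caption's ±4 standard deviation rule, is given below; the fitted appearance model, selected components, and β-coefficients are placeholders rather than values from the paper.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(5)
        pca = PCA(n_components=20).fit(rng.normal(size=(300, 1000)))  # placeholder model
        selected_pcs = [1, 12, 4]               # placeholder components in the regression
        betas = np.array([0.8, -0.5, 0.3])      # placeholder regression coefficients

        def extreme_face(direction=+1, n_sd=4):
            """Appearance vector predicted as very high (direction=+1) or very
            low (direction=-1) for a given Rating."""
            stds = np.sqrt(pca.explained_variance_)
            face = pca.mean_.copy()
            for beta, j in zip(betas, selected_pcs):
                # each beta-coefficient multiplied by +/-4 SD of its facial component
                face += beta * direction * n_sd * stds[j] * pca.components_[j]
            return face

        low_face, high_face = extreme_face(-1), extreme_face(+1)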